
Conversation


@emergenz emergenz commented Dec 31, 2025

We currently use our own LoRA implementation, while miles will rely on PEFT once its implementation is merged into main. We should therefore migrate to PEFT. We should also match the code structure of the two PRs to minimize merge conflicts down the line.

Miles LoRA PRs:

radixark#377

radixark#326

Changes

| Aspect | Old | New (PR 326 compatible) |
| --- | --- | --- |
| Enable LoRA | `--use-lora` flag | `--lora-rank > 0` |
| Target modules | `--lora-target-modules` | `--target-modules` |
| LoRA detection | `args.use_lora` | `is_lora_model(model)` |
| Checkpoint dir | `checkpoint_dir/` | `checkpoint_dir/adapter/` |
| State dict key | `"model"` | `"adapter"` (when LoRA) |
| LoRA library | Custom `LoRALinear` | PEFT library |
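A minimal sketch of what the new conventions in the table could look like in code. The helper names (`lora_enabled`, `adapter_checkpoint_dir`, `state_dict_key`) are illustrative, not taken from either PR, and the `is_lora_model` shown here is a duck-typed guess: PEFT-wrapped models expose a `peft_config` attribute and PEFT LoRA layers carry `lora_A`/`lora_B`, but PR 326's actual implementation may differ.

```python
import argparse
import os


def lora_enabled(args: argparse.Namespace) -> bool:
    # New convention (PR 326): LoRA is on iff --lora-rank > 0,
    # replacing the old boolean --use-lora flag.
    return getattr(args, "lora_rank", 0) > 0


def adapter_checkpoint_dir(checkpoint_dir: str) -> str:
    # PEFT adapter weights live under checkpoint_dir/adapter/
    # instead of directly under checkpoint_dir/.
    return os.path.join(checkpoint_dir, "adapter")


def state_dict_key(args: argparse.Namespace) -> str:
    # Checkpoints keyed by "adapter" when LoRA is active, else "model".
    return "adapter" if lora_enabled(args) else "model"


def is_lora_model(model) -> bool:
    # Hypothetical duck-typed detection (assumption, not PR 326's code):
    # PEFT wrappers expose `peft_config`; PEFT LoRA layers expose `lora_A`.
    if hasattr(model, "peft_config"):
        return True
    modules = getattr(model, "modules", None)
    return callable(modules) and any(hasattr(m, "lora_A") for m in modules())
```

Detecting LoRA from the model object rather than from `args.use_lora` keeps checkpoint loading and saving correct even when the model is constructed outside the argument-parsing path.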

